

Search for: All records

Creators/Authors contains: "Delmonaco, Daniel"

Note: Clicking a Digital Object Identifier (DOI) will take you to an external site maintained by the publisher. Some full-text articles may not yet be available free of charge during the publisher's embargo period.

Some links on this page may take you to non-federal websites. Their policies may differ from this site.

  1. Content creators with marginalized identities are disproportionately affected by shadowbanning on social media platforms, which impacts their economic prospects online. Through a diary study and interviews with eight marginalized content creators who are women, pole dancers, plus size, and/or LGBTQIA+, this paper examines how content creators with marginalized identities experience shadowbanning. We highlight the labor and economic inequalities of shadowbanning, and the resulting invisible online labor that marginalized creators often must perform. We identify three types of invisible labor that marginalized content creators engage in to mitigate shadowbanning and sustain their online presence: mental and emotional labor, misdirected labor, and community labor. We conclude that even though marginalized content creators engaged in cross-platform collaborative labor and personal mental/emotional labor to mitigate the impacts of shadowbanning, this labor was insufficient to prevent the uncertainty and economic precarity created by algorithmic opacity and ambiguity.
    Free, publicly-accessible full text available January 10, 2026
  2. Research suggests that marginalized social media users face disproportionate content moderation and removal. However, when content is removed or accounts are suspended, the processes governing content moderation are largely invisible, making it difficult to assess content moderation bias. To study this bias, we conducted a digital ethnography of marginalized users on Reddit’s /r/FTM subreddit and Twitch’s “Just Chatting” and “Pools, Hot Tubs, and Beaches” categories, observing content moderation visibility in real time. We found that on Reddit, a text-based platform, platform tools make content moderation practices invisible to users, but moderators make their practices visible through communication with users. On Twitch, a live chat and streaming platform, content moderation practices are instead visible in channel live chats, “unban appeal” streams, and “back from my ban” streams. Our ethnography shows how content moderation visibility differs in important ways between social media platforms, at times harming those who must see offensive content and at other times allowing for increased platform accountability.